
# Sparse mixture of experts

## Turkish Deepseek

License: Apache-2.0

A language model based on the DeepSeek architecture and trained on Turkish text, incorporating Multi-Head Latent Attention (MLA) and a sparse Mixture of Experts (MoE) feed-forward design; a minimal sketch of MoE routing appears after this card.

Tags: Large Language Model, Transformers, Other
Author: alibayram · 106 downloads · 1 like
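
To make the tag concrete: in a sparse mixture-of-experts layer, a router scores every token against a pool of expert feed-forward networks and sends each token through only its top-k experts, so parameter count grows with the number of experts while per-token compute stays roughly constant. Below is a minimal PyTorch sketch of this general technique with top-k routing. The `SparseMoE` class, its dimensions, expert count, and `top_k` value are illustrative assumptions, not the actual configuration of the Turkish Deepseek model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Sparse MoE feed-forward layer: each token is routed to its top_k experts."""

    def __init__(self, d_model=256, d_ff=512, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router: assigns a score to every (token, expert) pair.
        self.gate = nn.Linear(d_model, n_experts)
        # Pool of independent expert feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):
        # x: (batch, seq, d_model); flatten to a list of tokens for routing.
        tokens = x.reshape(-1, x.size(-1))
        scores = self.gate(tokens)                          # (n_tokens, n_experts)
        top_vals, top_idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(top_vals, dim=-1)               # renormalize over chosen experts
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            # "Sparse": only tokens whose top_k choices include expert e run through it.
            token_ids, slot = (top_idx == e).nonzero(as_tuple=True)
            if token_ids.numel():
                out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
        return out.reshape_as(x)

layer = SparseMoE()
y = layer(torch.randn(2, 16, 256))   # (batch=2, seq=16, d_model=256)
print(y.shape)                       # torch.Size([2, 16, 256])
```

With these settings, each token activates only 2 of the 8 experts, which is the capacity/compute trade-off that MoE architectures in the DeepSeek family exploit.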